Localist Attractor Networks

Authors

  • Richard S. Zemel
  • Michael C. Mozer
Abstract

Attractor networks, which map an input space to a discrete output space, are useful for pattern completion--cleaning up noisy or missing input features. However, designing a net to have a given set of attractors is notoriously tricky; training procedures are CPU intensive and often produce spurious attractors and ill-conditioned attractor basins. These difficulties occur because each connection in the network participates in the encoding of multiple attractors. We describe an alternative formulation of attractor networks in which the encoding of knowledge is local, not distributed. Although localist attractor networks have similar dynamics to their distributed counterparts, they are much easier to work with and interpret. We propose a statistical formulation of localist attractor net dynamics, which yields a convergence proof and a mathematical interpretation of model parameters. We present simulation experiments that explore the behavior of localist attractor networks, showing that they yield few spurious attractors, and they readily exhibit two desirable properties of psychological and neurobiological models: priming (faster convergence to an attractor if the attractor has been recently visited) and gang effects (in which the presence of an attractor enhances the attractor basins of neighboring attractors).
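The dynamics described above can be illustrated with a small numerical sketch. In a localist formulation, each stored pattern is represented by its own unit, and the state is pulled toward a responsibility-weighted average of the stored patterns, with the weighting sharpened over time so the state settles on a single attractor. The sketch below assumes Gaussian responsibilities and a simple geometric annealing schedule for illustration; the paper's exact statistical formulation differs in detail.

```python
import numpy as np

def cleanup(x, attractors, sigma=1.0, anneal=0.9, steps=50):
    """Pull a noisy input toward one of a set of stored attractors.

    Each attractor is encoded locally (one row = one stored pattern).
    The state moves to a responsibility-weighted average of the attractors,
    and the variance is annealed so the state commits to a single winner.
    (Illustrative assumptions: Gaussian responsibilities, fixed schedule.)
    """
    y = x.copy()
    for _ in range(steps):
        d2 = np.sum((attractors - y) ** 2, axis=1)   # squared distances
        q = np.exp(-d2 / (2 * sigma ** 2))           # unnormalized responsibilities
        q /= q.sum()                                 # soft assignment over attractors
        y = q @ attractors                           # weighted average of patterns
        sigma *= anneal                              # annealing sharpens assignment
    return y

# Example: two stored patterns; a noisy input is cleaned up to the nearer one.
patterns = np.array([[1.0, 1.0], [-1.0, -1.0]])
noisy = np.array([0.8, 1.2])
print(cleanup(noisy, patterns))  # settles near [1, 1]
```

Because each pattern occupies its own unit, adding or removing an attractor is a local edit to the `attractors` array rather than a retraining problem, which is the ease-of-use property the abstract emphasizes.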


Similar articles

Localist Attractor Networks Submitted to: Neural Computation

Attractor networks, which map an input space to a discrete output space, are useful for pattern completion—cleaning up noisy or missing input features. However, designing a net to have a given set of attractors is notoriously tricky; training procedures are CPU intensive and often produce spurious attractors and ill-conditioned attractor basins. These difficulties occur because each connection ...


A Generative Model for Attractor Dynamics

Attractor networks, which map an input space to a discrete output space, are useful for pattern completion. However, designing a net to have a given set of attractors is notoriously tricky; training procedures are CPU intensive and often produce spurious attractors and ill-conditioned attractor basins. These difficulties occur because each connection in the network participates in the encoding o...


Spreading Activation in an Attractor Network With Latching Dynamics: Automatic Semantic Priming Revisited

Localist models of spreading activation (SA) and models assuming distributed representations offer very different takes on semantic priming, a widely investigated paradigm in word recognition and semantic memory research. In this study, we implemented SA in an attractor neural network model with distributed representations and created a unified framework for the two approaches. Our models assum...


Analog-symbolic memory that tracks via reconsolidation

A fundamental part of a computational system is its memory, which is used to store and retrieve data. Classical computer memories rely on the static approach and are very different from human memories. Neural network memories are based on auto-associative attractor dynamics and thus provide a high level of pattern completion. However, they are not used in general computation since there are pra...


Representations for a Complex World: Combining Distributed and Localist Representations for Learning and Planning

To have agents autonomously model a complex environment, it is desirable to use distributed representations that lend themselves to neural learning. Yet developing and executing plans acting on the environment calls for abstract, localist representations of events, objects and categories. To combine these requirements, a formalism that can express neural networks, action sequences and symbolic ...



Journal:
  • Neural Computation

Volume 13, Issue 5

Published: 2001